Large-Scale Optimization with Linear Equality Constraints Using Reduced Compact Representation

Authors

Abstract

Article Data
History: Submitted 27 January 2021; Accepted 8 September 2021; Published online 13 2022
Keywords: large-scale optimization, compact representation, trust-region method, limited memory, LSQR, sparse QR
AMS Subject Headings: 68Q25, 68R10, 68U05
Publication Data: ISSN (print) 1064-8275; ISSN (online) 1095-7197
Publisher: Society for Industrial and Applied Mathematics
CODEN: sjoce3

Similar Articles

Optimization Problems Under One-sided (max,min)-Linear Equality Constraints

In this article we consider optimization problems whose objective function is the maximum of a finite number of continuous functions of one variable. The set of feasible solutions is described by a system of (max,min)-linear equality constraints with variables on one side. The complexity of the proposed method for monotone or unimodal functions will be studied, possible gen...
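
To make the constraint structure concrete, here is a minimal Python sketch (an illustration under assumed notation, not code from the cited article) of a one-sided (max,min)-linear equality system, where each constraint requires max_j min(a_ij, x_j) = b_i. The function names and the toy data are placeholders.

```python
# Minimal sketch of a one-sided (max, min)-linear equality system:
#   max_j min(A[i][j], x[j]) = b[i]   for every row i.
# Illustrative only; names and data are hypothetical, not from the cited article.

def maxmin_residuals(A, x, b):
    """Return max_j min(A[i][j], x[j]) - b[i] for every constraint i."""
    return [max(min(a_ij, x_j) for a_ij, x_j in zip(row, x)) - b_i
            for row, b_i in zip(A, b)]

def is_feasible(A, x, b, tol=1e-12):
    """x satisfies the (max, min)-linear equalities if all residuals vanish."""
    return all(abs(r) <= tol for r in maxmin_residuals(A, x, b))

if __name__ == "__main__":
    A = [[3.0, 1.0],
         [2.0, 5.0]]
    b = [2.0, 2.0]
    x = [2.0, 0.5]   # row 1: max(min(3,2), min(1,0.5)) = 2; row 2: max(min(2,2), min(5,0.5)) = 2
    print(maxmin_residuals(A, x, b))   # [0.0, 0.0]
    print(is_feasible(A, x, b))        # True
```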

Nonlinear Robust Optimization with Uncertain Equality Constraints

This paper addresses the problem of nonlinear optimization for process design under uncertainty. A novel robust optimization framework is proposed for general nonlinear problems under uncertainty. To overcome the limitation of single-point linearization with respect to the uncertain parameters over a large uncertainty region, an iterative algorithm is developed. The new method applies lo...
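
As a rough illustration of why a large uncertainty region matters (a generic first-order sketch, not the authors' framework), the snippet below bounds the worst-case violation of an uncertain equality constraint h(x, p) = 0 over a box around a nominal parameter p0; the constraint h, its parameter gradient, and the box radius delta are all hypothetical.

```python
# Generic sketch (not the cited paper's method): first-order worst-case bound for an
# uncertain equality constraint h(x, p) = 0 over the box |p - p0| <= delta (componentwise).
import numpy as np

def worst_case_violation(h, grad_p_h, x, p0, delta):
    """|h(x, p)| <= |h(x, p0)| + sum_k |dh/dp_k(x, p0)| * delta_k  (curvature ignored)."""
    return abs(h(x, p0)) + np.abs(grad_p_h(x, p0)) @ np.asarray(delta)

# Toy constraint with uncertain coefficients p:  h(x, p) = p1*x1 + p2*x2 - 1.
h      = lambda x, p: p[0] * x[0] + p[1] * x[1] - 1.0
grad_p = lambda x, p: np.array([x[0], x[1]])           # gradient of h with respect to p

x0    = np.array([0.4, 0.3])
p0    = np.array([1.0, 2.0])
delta = np.array([0.05, 0.05])
print(worst_case_violation(h, grad_p, x0, p0, delta))  # 0 + 0.05*(0.4 + 0.3) = 0.035
```

An iterative scheme in this spirit would re-linearize h at updated points whenever a single expansion is too loose over the full uncertainty box.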

Adaptive Augmented Lagrangian Methods for Large-Scale Equality Constrained Optimization

We propose an augmented Lagrangian algorithm for solving large-scale equality constrained optimization problems. The novel feature of the algorithm is an adaptive update for the penalty parameter motivated by recently proposed techniques for exact penalty methods. This adaptive updating scheme greatly improves the overall performance of the algorithm without sacrificing the strengths of the cor...
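
For context, here is a minimal augmented Lagrangian loop for min f(x) subject to c(x) = 0. The penalty increase below is a simple stalling heuristic, not the adaptive update proposed in the paper, and the inner solver (scipy.optimize.minimize with BFGS) plus the toy problem are assumptions for the sketch.

```python
# Minimal augmented Lagrangian sketch for  min f(x)  s.t.  c(x) = 0.
# The penalty update is a generic heuristic, not the cited paper's adaptive rule.
import numpy as np
from scipy.optimize import minimize

def aug_lagrangian(f, c, x0, rho=10.0, tol=1e-8, max_outer=50):
    x = np.asarray(x0, dtype=float)
    lam = np.zeros(len(c(x)))                    # multiplier estimates
    prev_viol = np.inf
    for _ in range(max_outer):
        # Inner subproblem: minimize L_A(x; lam, rho) over x (BFGS, numerical gradients).
        L = lambda z: f(z) + lam @ c(z) + 0.5 * rho * np.sum(c(z) ** 2)
        x = minimize(L, x, method="BFGS").x
        viol = np.linalg.norm(c(x))
        if viol <= tol:
            break
        lam = lam + rho * c(x)                   # first-order multiplier update
        if viol > 0.25 * prev_viol:              # heuristic: raise the penalty if progress stalls
            rho *= 10.0
        prev_viol = viol
    return x, lam

# Toy example:  min (x1-1)^2 + (x2-2.5)^2  s.t.  x1 + x2 = 2   ->   x* = (0.25, 1.75)
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2
c = lambda x: np.array([x[0] + x[1] - 2.0])
x_star, lam_star = aug_lagrangian(f, c, x0=[0.0, 0.0])
print(np.round(x_star, 4))                       # approximately [0.25, 1.75]
```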

A Random Coordinate Descent Method on Large-scale Optimization Problems with Linear Constraints

In this paper we develop a random block coordinate descent method for minimizing large-scale convex problems with linearly coupled constraints and prove that it obtains in expectation an ε-accurate solution in at most O(1/ε) iterations. However, the numerical complexity per iteration of the new method is usually much cheaper than that of methods based on full gradient information. We focus on...
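
A minimal sketch of the pairwise idea (not the paper's exact method or rate analysis): with a single coupling constraint such as sum(x) = const, moving a randomly chosen pair of coordinates along e_i - e_j keeps the iterate feasible, and for a quadratic objective the exact step along that direction is available in closed form. The quadratic test problem and function names below are illustrative assumptions.

```python
# Sketch of random 2-coordinate descent for  min 0.5*x^T Q x - q^T x  with the single
# coupling constraint sum(x) = const (the constant is fixed by the feasible start x0).
import numpy as np

def random_pair_cd(Q, q, x0, iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    n = len(x)
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)
        g = Q @ x - q                              # gradient of the quadratic
        curv = Q[i, i] - 2.0 * Q[i, j] + Q[j, j]   # curvature along d = e_i - e_j
        if curv > 0:
            t = -(g[i] - g[j]) / curv              # exact minimizer along d
            x[i] += t                              # this step leaves sum(x) unchanged
            x[j] -= t
    return x

# Example:  min 0.5*||x - c||^2  s.t.  sum(x) = 0   ->   x* = c - mean(c)
n = 5
c = np.arange(1.0, n + 1.0)
x = random_pair_cd(np.eye(n), c, x0=np.zeros(n))   # zeros(n) is feasible: sum = 0
print(np.round(x, 4))                              # approx [-2, -1, 0, 1, 2]
```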

Small-Data, Large-Scale Linear Optimization

Optimization applications often depend upon a huge number of uncertain parameters. In many contexts, however, the amount of relevant data per parameter is small, and hence, we may have only imprecise estimates. We term this setting – where the number of uncertainties is large, but all estimates have fixed and low precision – the “small-data, large-scale regime.” We formalize a model for this re...

Journal

Journal title: SIAM Journal on Scientific Computing

Year: 2022

ISSN: 1064-8275 (print), 1095-7197 (online)

DOI: https://doi.org/10.1137/21m1393819